Incremental Parsing by Modular Recurrent Connectionist Networks

Authors

  • Ajay N. Jain
  • Alexander H. Waibel
Abstract

We present a novel, modular, recurrent connectionist network architecture which learns to robustly perform incremental parsing of complex sentences. From sequential input, one word at a time, our networks learn to do semantic role assignment, noun phrase attachment, and clause structure recognition for sentences with passive constructions and center embedded clauses. The networks make syntactic and semantic predictions at every point in time, and previous predictions are revised as expectations are affirmed or violated with the arrival of new information. Our networks induce their own "grammar rules" for dynamically transforming an input sequence of words into a syntactic/semantic interpretation. These networks generalize and display tolerance to input which has been corrupted in ways common in spoken language.
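The core idea in the abstract — a recurrent network that consumes one word at a time, carries context in its hidden state, and emits a revisable interpretation at every step — can be sketched with a minimal Elman-style simple recurrent network. The vocabulary, layer sizes, role labels, and random weights below are illustrative assumptions for the sketch, not the paper's actual modular architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["the", "dog", "chased", "cat", "was"]
hidden_size, n_roles = 8, 3  # e.g. agent / action / patient (assumed labels)

# Untrained random weights, purely for illustrating the forward pass.
W_in = rng.normal(scale=0.1, size=(hidden_size, len(vocab)))
W_rec = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
W_out = rng.normal(scale=0.1, size=(n_roles, hidden_size))

def one_hot(word):
    v = np.zeros(len(vocab))
    v[vocab.index(word)] = 1.0
    return v

def parse_incrementally(sentence):
    """Return a role-probability prediction after every word."""
    h = np.zeros(hidden_size)  # recurrent state accumulates left context
    predictions = []
    for word in sentence:
        # New word plus previous state -> updated state: this is what lets
        # later words revise the interpretation of earlier ones.
        h = np.tanh(W_in @ one_hot(word) + W_rec @ h)
        scores = W_out @ h
        probs = np.exp(scores) / np.exp(scores).sum()  # softmax over roles
        predictions.append(probs)
    return predictions

preds = parse_incrementally(["the", "dog", "chased", "the", "cat"])
```

With trained weights (and the paper's modular decomposition into role-assignment, attachment, and clause-structure components), each `preds[t]` would be the network's current best interpretation after word `t`, revised as expectations are affirmed or violated.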


Similar Papers

A Modular Connectionist Parser for Resolution of Pronominal Anaphoric References in Multiple Sentences

In this work, a connectionist model for resolving a well-known linguistic phenomenon, pronominal anaphoric reference, is presented. The model is composed of two neural networks: a simple recurrent neural network (parser) and a feedforward neural network (segmenter). These networks are trained and tested simultaneously. With this model it is possible to solve anaphoric references wit...


Parsing Complex Sentences with Structured Connectionist Networks

A modular, recurrent connectionist network is taught to incrementally parse complex sentences. From input presented one word at a time, the network learns to do semantic role assignment, noun phrase attachment, and clause structure recognition, for sentences with both active and passive constructions and center-embedded clauses. The network makes syntactic and semantic predictions at every step...


Connectionist-Inspired Incremental PCFG Parsing

Probabilistic context-free grammars (PCFGs) are a popular cognitive model of syntax (Jurafsky, 1996). These can be formulated to be sensitive to human working memory constraints by application of a right-corner transform (Schuler, 2009). One side-effect of the transform is that it guarantees at most a single expansion (push) and at most a single reduction (pop) during a syntactic parse. The pri...


A Constructive Approach to Parsing with Neural Networks - The Hybrid Connectionist Parsing Method

The concept of Dynamic Neural Networks (DNN) is a new approach within the Neural Network paradigm, which is based on the dynamic construction of Neural Networks during the processing of an input. The DNN methodology has been employed in the Hybrid Connectionist Parsing (HCP) approach, which comprises an incremental, on-line generation of a Neural Network parse tree. The HCP ensures an adequate ...


Learning grammars with recurrent neural networks

Most language acquisition models constructed so far are based on traditional AI approaches. On the other hand, artificial neural networks (ANNs), in contrast with traditional AI approaches, have many desirable properties, such as the ability to learn, generalization, and robustness. But they are poor at representing and manipulating compositional structures, and are consi...



Publication date: 1989